15 research outputs found

    Modelo programable para la serialización y evaluación de modelos heterogéneos en clientes web

    Full text link
Unpublished doctoral thesis, defended at the Universidad Autónoma de Madrid, Escuela Politécnica Superior, Departamento de Ingeniería Informática. Date of defense: 6-07-201

    An approach to build JSON-based Domain Specific Languages solutions for web applications

    Full text link
Because of their level of abstraction, Domain-Specific Languages (DSLs) enable building applications that ease software implementation. In the context of web applications, there are many technologies and programming languages for server-side applications that provide fast, robust, and flexible solutions, whereas those for client-side applications are limited, mostly restricted to the direct use of JavaScript, HTML5, CSS3, JSON and XML. This article presents a novel approach to creating DSL-based web applications using a JSON grammar (JSON-DSL) for both the server and the client side. The approach includes an evaluation engine, a programming model and an integrated web development environment that support it. The evaluation engine allows the execution of the elements created with the programming model. For its part, the programming model allows the definition and specification of JSON-DSLs, the implementation of JavaScript components, the use of JavaScript templates provided by the engine, the use of link connectors to heterogeneous information sources, and the integration with other widgets, web components and JavaScript frameworks. To validate the strength and capacity of our approach, we have developed four case studies that use the integrated web development environment to apply the programming model and check the results within the evaluation engine.
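To make the idea of a JSON grammar interpreted on the client concrete, the sketch below shows a toy JSON-DSL and a minimal recursive evaluator. The node shape (`op`, `value`, `times`, `children`) and the `evaluate` function are illustrative assumptions, not the paper's actual programming model or PsiEngine API.

```typescript
// Minimal sketch of a JSON-based DSL and a toy evaluation engine.
// All names here are illustrative, not the paper's actual API.

// A hypothetical JSON-DSL: each node declares an operation and its arguments.
interface DslNode {
  op: "text" | "repeat" | "concat";
  value?: string;
  times?: number;
  children?: DslNode[];
}

// A tiny recursive interpreter over the JSON grammar.
function evaluate(node: DslNode): string {
  switch (node.op) {
    case "text":
      return node.value ?? "";
    case "repeat":
      return (node.value ?? "").repeat(node.times ?? 1);
    case "concat":
      return (node.children ?? []).map(evaluate).join(" ");
    default:
      return ""; // unreachable for well-formed programs
  }
}

// Example "program" written in the JSON-DSL.
const program: DslNode = {
  op: "concat",
  children: [
    { op: "text", value: "hello" },
    { op: "repeat", value: "web!", times: 2 },
  ],
};

console.log(evaluate(program)); // "hello web!web!"
```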

    Laser lithotripsy fundamentals: from the physics to optimal fragmentation

    Get PDF
Purpose: Laser lithotripsy has remained the cornerstone of the management of urolithiasis for more than thirty years. Miniaturization of endoscopic equipment, digital vision, and improvement of laser lithotripters and laser fibers have made endourology a field of growing interest, immersed in a technological revolution. The aim of this article is to provide an extensive review of laser lithotripsy, from the physics of lasers to the translational science applied to lithotripsy fundamentals, in order to make lithotripsy safer and more efficient. Methods: In February 2020, three authors independently reviewed the literature in four databases (PubMed, Embase, Ovid®, and Scopus®) for any information concerning laser lithotripsy; a total of 186 articles were reviewed, 38 of the most influential articles were selected, and a detailed review of the topic is presented. Results: We aim to provide a reference paper for all urologists and health personnel involved in laser lithotripsy, starting from the physics and answering practical questions such as how to set the parameters of a laser system, how to improve lithotripsy efficiency, and whether to dust or bust, and finally discussing new technologies such as the Holmium:Yttrium-Aluminium-Garnet (Ho:YAG) Moses technology and the revolutionary Thulium Fiber Laser (TFL), as well as the future of laser lithotripsy. Conclusions: Laser lithotripsy must offer higher ablative efficiency, a wider range of laser parameters and comprehensive combinations, reduced retropulsion and fiber burnback, scope miniaturization capabilities, smaller fiber sizes, increased safety, lower environmental impact, and reduced maintenance costs. Ho:YAG has remained the unquestioned gold standard for laser lithotripsy, but the recently launched Thulium fiber laser has all the above-mentioned features, undoubtedly outperforms the current gold standard, and is set to gradually replace it.
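As a minimal illustration of the parameter question raised above, the following sketch applies the standard relation that average laser power equals pulse energy times pulse frequency; the example settings contrasting dusting and fragmentation are hypothetical, not values recommended by the article.

```typescript
// Illustrative arithmetic only (not from the paper): average power (W) is the
// product of pulse energy (J) and pulse frequency (Hz), the basic relation
// behind choosing lithotripsy settings.

function averagePowerWatts(pulseEnergyJoules: number, frequencyHz: number): number {
  return pulseEnergyJoules * frequencyHz;
}

// Two commonly contrasted setting styles (hypothetical example values):
const dusting = averagePowerWatts(0.4, 50);       // low energy, high frequency -> 20 W
const fragmentation = averagePowerWatts(1.2, 10); // high energy, low frequency -> 12 W

console.log({ dusting, fragmentation });
```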

    Simple time-biased KNN-based recommendations

    Full text link
This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CAMRa '10, Proceedings of the Workshop on Context-Aware Movie Recommendation, http://dx.doi.org/10.1145/1869652.1869655. In this paper, we describe the experiments conducted by the Information Retrieval Group at the Universidad Autónoma de Madrid (Spain) in order to better recommend movies for the 2010 CAMRa Challenge edition. Experiments were carried out on the dataset corresponding to the weekly Filmtipset track. We consider simple strategies for taking the temporal context into account for movie recommendations, mainly based on variations of the KNN algorithm, which has been deeply studied in the literature, and one ad-hoc strategy that takes advantage of particular information in the weekly Filmtipset track. Results show that using only information close to the recommendation date can help improve recommendation results, with the additional benefit of reducing the information overload of the recommender engine. Furthermore, the use of social interaction information also contributes to better predicting part of the users' tastes. This research was supported by the Spanish Ministry of Science and Innovation (TIN2008-06566-C04-02) and the Scientific Computing Institute at UAM. The first author acknowledges support from the Chilean Government through the Becas-Chile scholarship program.
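The following sketch shows one simplified way a time bias can be added to a KNN-style prediction: neighbours' ratings are down-weighted by an exponential decay on their age, so recent ratings dominate. The decay form, the half-life parameter and the `predict` signature are assumptions for illustration, not the exact variants evaluated in the paper.

```typescript
// Sketch of a time-biased neighbourhood prediction (assumed, simplified form).

interface Rating { user: string; item: string; value: number; timestamp: number; }

function predict(
  ratings: Rating[],
  similarity: Map<string, number>, // precomputed user-user similarities to the target user
  item: string,
  now: number,                     // seconds since epoch
  halfLifeDays = 30
): number | undefined {
  const lambda = Math.LN2 / (halfLifeDays * 24 * 3600);
  let num = 0;
  let den = 0;
  for (const r of ratings) {
    if (r.item !== item) continue;
    const sim = similarity.get(r.user) ?? 0;
    const decay = Math.exp(-lambda * (now - r.timestamp)); // favour recent ratings
    num += sim * decay * r.value;
    den += Math.abs(sim) * decay;
  }
  return den > 0 ? num / den : undefined; // undefined: no neighbour rated the item
}
```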

    Movie recommendations based in explicit and implicit features extracted from the filmtipset dataset

    Full text link
This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CAMRa '10, Proceedings of the Workshop on Context-Aware Movie Recommendation, http://dx.doi.org/10.1145/1869652.1869660. In this paper, we describe the experiments conducted by the Information Retrieval Group at the Universidad Autónoma de Madrid (Spain) in order to better recommend movies for the 2010 CAMRa Challenge edition. Experiments were carried out on the dataset corresponding to the social Filmtipset track. To obtain the movie recommendations we used different algorithms based on random walks, which are well documented in the collaborative recommendation literature. We also included a new proposal in one of the algorithms in order to obtain better results. The results were computed by means of the trec_eval standard NIST evaluation procedure. This research was supported by the Spanish Ministry of Science and Innovation (TIN2008-06566-C04-02) and the Scientific Computing Institute at UAM. The third author also acknowledges support from the Chilean Government through the Becas-Chile scholarship program.
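As a rough illustration of the random-walk family mentioned above, the sketch below implements a plain random walk with restart over a mixed user-movie graph and ranks nodes by their resulting scores; the graph representation, restart probability and iteration count are assumptions, and the paper's own modification is not reproduced here.

```typescript
// Sketch of a random walk with restart over a user-item graph (a common
// random-walk recommender; simplified, not the paper's exact algorithms).

type Graph = Map<string, string[]>; // node -> neighbours (users and movies mixed)

function randomWalkScores(graph: Graph, start: string, alpha = 0.15, iters = 50): Map<string, number> {
  const nodes = [...graph.keys()];
  let scores = new Map<string, number>();
  for (const n of nodes) scores.set(n, n === start ? 1 : 0); // all mass on the target user

  for (let i = 0; i < iters; i++) {
    const next = new Map<string, number>();
    for (const n of nodes) next.set(n, n === start ? alpha : 0); // restart mass
    for (const [node, p] of scores) {
      const nbrs = graph.get(node) ?? [];
      for (const nbr of nbrs) {
        next.set(nbr, (next.get(nbr) ?? 0) + ((1 - alpha) * p) / nbrs.length);
      }
    }
    scores = next;
  }
  return scores; // rank unseen movies by their score for the target user
}
```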

    An empirical comparison of social, collaborative filtering, and hybrid recommenders

    Full text link
This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in ACM Transactions on Intelligent Systems and Technology, http://dx.doi.org/10.1145/2414425.2414439. In the Social Web, a number of diverse recommendation approaches have been proposed to exploit the user-generated contents available in the Web, such as rating, tagging, and social networking information. In general, these approaches naturally require the availability of a large amount of these user preferences. This may represent an important limitation for real applications, and may go somewhat unnoticed in studies focusing on overall precision, in which a failure to produce recommendations gets blurred when averaging the obtained results or, even worse, is simply not accounted for, as users with no recommendations are typically excluded from the performance calculations. In this article, we propose a coverage metric that uncovers and compensates for the incompleteness of performance evaluations based only on precision. We use this metric together with precision metrics in an empirical comparison of several social, collaborative filtering, and hybrid recommenders. The obtained results show that a better balance between precision and coverage can be achieved by combining social-based filtering (high accuracy, low coverage) and collaborative filtering (low accuracy, high coverage) recommendation techniques. We thus explore several hybrid recommendation approaches to balance this trade-off. In particular, we compare, on the one hand, techniques integrating collaborative and social information into a single model, and on the other, linear combinations of recommenders. For the latter approach, we also propose a novel strategy to dynamically adjust the weight of each recommender on a per-user basis, utilizing graph measures as indicators of the target user's connectedness and relevance in a social network. This work was supported by the Spanish Ministry of Science and Innovation (TIN2008-06566-C04-02), Universidad Autónoma de Madrid (CCG10-UAM/TIC-5877), and the Scientific Computing Institute at UAM.
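The sketch below illustrates, in a simplified and assumed form, the two ingredients described above: a user-level coverage measure, and a per-user linear combination of a social score and a collaborative-filtering score whose weight is driven by a graph measure (here, hypothetically, the user's relative number of friends).

```typescript
// Coverage: fraction of users for whom a recommender produced at least one item.
function userCoverage(recommendations: Map<string, string[]>, allUsers: string[]): number {
  const served = allUsers.filter(u => (recommendations.get(u) ?? []).length > 0).length;
  return allUsers.length > 0 ? served / allUsers.length : 0;
}

// Per-user hybrid: weight the social recommender more for well-connected users.
// The degree-based weighting is a hypothetical choice, not the paper's exact strategy.
function hybridScore(
  socialScore: number | undefined,
  cfScore: number | undefined,
  friendCount: number,
  maxFriends: number
): number {
  const w = maxFriends > 0 ? friendCount / maxFriends : 0;
  return w * (socialScore ?? 0) + (1 - w) * (cfScore ?? 0);
}
```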

    Automated migration of EuGENia graphical editors to the web

    Full text link
© ACM 2020. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Proceedings of the 23rd ACM/IEEE International Conference on Model Driven Engineering Languages and Systems: Companion Proceedings, http://dx.doi.org/10.1145/3417990.3420205. Domain-specific languages (DSLs) are languages tailored for particular domains. Many frameworks and tools have been proposed to develop editors for DSLs, especially for desktop IDEs like Eclipse. We are witnessing the advent of low-code development platforms, which are cloud-based environments supporting rapid application development by using graphical languages and forms. While this approach is very promising, the creation of new low-code platforms may require the migration of existing desktop-based editors to the web. However, this is a technically challenging task. To fill this gap, we present ROCCO, a tool that migrates Eclipse-based graphical modelling editors to the web, to facilitate their integration with low-code platforms. The tool reads a meta-model annotated with EuGENia annotations and generates a web editor using the DPG web framework used by the UGROUND company. In this paper, we present the approach, including tool support and an evaluation based on migrating nine editors created by third parties, which shows the usefulness of the tool. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement n° 813884, Lowcomote [33]. The work has also been supported by the Spanish Ministry of Science (project MASSIVE, RTI2018-095255-B-I00) and the R&D programme of Madrid (project FORTE, P2018/TCS-4314).
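Purely as an illustration of the migration idea, the sketch below captures a node/link-annotated meta-model as plain data and derives a web-editor palette from it; the `AnnotatedClass` shape and `generatePalette` function are hypothetical stand-ins, not the EuGENia annotation syntax or the DPG framework API.

```typescript
// Hypothetical data model standing in for an annotated meta-model: each class is
// marked as a diagram node or link, mirroring the role of @gmf-style annotations.
interface AnnotatedClass {
  name: string;
  kind: "node" | "link";
  labelAttribute?: string; // attribute shown as the element's label, if any
}

// Trivial "generator": derive a web editor palette from the annotated classes.
function generatePalette(classes: AnnotatedClass[]): { nodes: string[]; links: string[] } {
  return {
    nodes: classes.filter(c => c.kind === "node").map(c => c.name),
    links: classes.filter(c => c.kind === "link").map(c => c.name),
  };
}

console.log(generatePalette([
  { name: "State", kind: "node", labelAttribute: "name" },
  { name: "Transition", kind: "link" },
])); // { nodes: ["State"], links: ["Transition"] }
```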

    Complete study for erectile dysfunction (CompED) improving diagnosis and treatment decision-making

    Get PDF
INTRODUCTION: Erectile dysfunction (ED) is a condition associated with increasing age, and its overall prevalence has been estimated at 18 to 47%. It is associated with numerous comorbidities and lifestyle attributes. Patient evaluation and management should follow a comprehensive, stepwise approach. The aim of this article is to report our experience with a Complete study for ED (CompED), which includes an intracavernosal injection (ICI) rigidity test, penile biothesiometry and colour duplex Doppler ultrasound (CDDUS) in a single session, after oral therapy failure. METHODS: One hundred and eighty-seven patients were recruited. Data were collected and analysed prospectively. For descriptive univariate analysis, central tendency and dispersion measures were used. For bivariate analysis, p-values were calculated with Fisher and chi-square tests. Multivariate analysis was performed using binary decision trees whose separating nodes were split according to the Gini coefficient. The Pearson correlation coefficient (PCC) was also used and reported through pairwise scatter plots to assess treatment decision-making. R Studio version 4.0.0 was used for the statistical calculations. RESULTS: Between May 2017 and January 2020, 187 patients with ED underwent the CompED test. Mean age was 57 +/- 12.8 years, median follow-up was 24 (IQR 12-240) months, and median IIEF-15 Domain A score was 6 (IQR 1-30). We divided the patients into subgroups: diabetes; coronary artery disease; prostate cancer treated with radical prostatectomy, radiotherapy or ADT; spinal cord injury; pelvic trauma; and HIV. Treatment decision-making was eased by the CompED test: 39 patients (20.6%) were offered a second trial of PDE5 inhibitors with a daily dose combined with an on-demand dose, 77 (40.7%) continued with ICI injections, and 73 (38.6%) were offered surgery. We found a strong correlation between tumescence and axial rigidity across all treatment decisions. The multivariate decision-tree analysis showed that a peak systolic velocity (PSV) < 17.5 cm/s, tumescence < 35%, resistance index (RI) < 0.74 and age ≥ 60.5 years influenced the decision to offer a penile prosthesis, and the best predictor for penile venous surgery was an end-diastolic velocity (EDV) ≥ 9.25 cm/s, with 92% sensitivity and 86% specificity. CONCLUSIONS: ED is a highly prevalent disease. Specialized testing should be considered in selected patients or patients unresponsive to first-line treatment. The CompED test stands as a new alternative for the evaluation of patients with ED, being less time-consuming, aiding a more accurate determination of the aetiology of ED, and guiding treatment decision-making.
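For readers unfamiliar with the split criterion mentioned in the methods, the sketch below computes the Gini impurity of a candidate binary split (for example, PSV below versus above a threshold) over treatment labels; the data layout and threshold are hypothetical, and only the impurity formula follows standard decision-tree practice.

```typescript
// Gini impurity of a set of class labels: 1 - sum of squared class proportions.
function gini(labels: string[]): number {
  if (labels.length === 0) return 0;
  const counts = new Map<string, number>();
  for (const l of labels) counts.set(l, (counts.get(l) ?? 0) + 1);
  let sumSq = 0;
  for (const c of counts.values()) sumSq += (c / labels.length) ** 2;
  return 1 - sumSq;
}

// Weighted impurity of splitting a numeric predictor at a threshold
// (e.g. PSV < 17.5 cm/s vs >= 17.5 cm/s) against treatment labels.
function splitGini(values: number[], labels: string[], threshold: number): number {
  const left: string[] = [];
  const right: string[] = [];
  values.forEach((v, i) => (v < threshold ? left : right).push(labels[i]));
  const n = labels.length;
  return (left.length / n) * gini(left) + (right.length / n) * gini(right);
}
```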

    PsiLight: a Lightweight Programming Language to Explore Multiple Program Execution and Data-binding in a Web-Client DSL Evaluation Engine

    No full text
Domain-Specific Languages (DSLs) allow building software applications by simplifying the labour of both software engineers and domain experts, thanks to the abstraction provided by high-level code. Introducing a DSL in the software development process requires the use of technologies and frameworks in the design and implementation activities. If we are restricted to web-client applications, then XML-based languages and JavaScript frameworks and widgets are commonly used and combined in order to provide fast, robust and flexible solutions. Under this scenario, we have developed the PsiEngine, an interpreter able to evaluate programs coded in high-level XML-based DSLs (XML-DSLs) to provide solutions to domain-specific problems within a web-client application. The goal of this article is to detail how we have built PsiLight, a lightweight programming language that runs on the web client. PsiLight constitutes the exploratory case study we conducted to check some features of PsiEngine, namely multiple program execution and the data-binding capabilities of our interpreter.
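A minimal browser-side sketch of the kind of evaluation PsiLight explores is shown below: a tiny XML program is parsed with the standard DOMParser and its `{{name}}` placeholders are resolved against a bound data context. The element names, placeholder syntax and `run` function are assumptions for illustration, not the actual PsiEngine or PsiLight API.

```typescript
// Browser-side sketch (assumed, simplified) of evaluating an XML-DSL program
// with data binding; relies only on the standard DOMParser available in web clients.

const source = `<program><print value="{{user}}"/><print value="welcome"/></program>`;

function run(xml: string, context: Record<string, string>): string[] {
  const doc = new DOMParser().parseFromString(xml, "application/xml");
  const output: string[] = [];
  doc.querySelectorAll("print").forEach(node => {
    const raw = node.getAttribute("value") ?? "";
    // Resolve {{name}} placeholders against the bound data context.
    output.push(raw.replace(/\{\{(\w+)\}\}/g, (_, k) => context[k] ?? ""));
  });
  return output;
}

console.log(run(source, { user: "Ada" })); // ["Ada", "welcome"]
```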